David L Miller
Build a `data.frame` with the covariates (\( x, y, \text{Depth}, A \)), then call `predict()`:

```r
preds <- predict(my_model, newdata=my_data, type="response")
```
(`se.fit=TRUE` gives a standard error for each prediction)
```r
dolphin_preds <- predict(dolphins_depth, newdata=preddata,
                         type="response")
```
(ggplot2 code included in the slide source)
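As a sketch of the interval-construction recipe (the model and data below are simulated stand-ins, not the dolphin example): predict on the link scale with `se.fit=TRUE`, build a normal-approximation interval, then back-transform with the inverse link.

```r
library(mgcv)

# simulate some example data and fit a simple GAM (illustrative only)
set.seed(1)
dat <- gamSim(1, n = 200, dist = "poisson", scale = 0.25, verbose = FALSE)
b <- gam(y ~ s(x2), data = dat, family = poisson())

# predict on the link scale, asking for standard errors
newd <- data.frame(x2 = seq(0, 1, length.out = 50))
p <- predict(b, newdata = newd, type = "link", se.fit = TRUE)

# approximate 95% interval, back-transformed to the response scale
linkinv <- b$family$linkinv
lower <- linkinv(p$fit - 1.96 * p$se.fit)
upper <- linkinv(p$fit + 1.96 * p$se.fit)
```

Building the interval on the link scale and transforming afterwards keeps the limits inside the valid range of the response (here, non-negative counts).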
Summary: build a prediction `data.frame`, don't forget the `type=...` argument, and use `se.fit=TRUE` when you need standard errors.
Where does uncertainty come from?

- \( \boldsymbol{\beta} \): uncertainty in the spline parameters
- \( \boldsymbol{\lambda} \): uncertainty in the smoothing parameter
(Traditionally we've only addressed the former)
(New tools let us address the latter…)
From theory:
\[ \boldsymbol{\beta} \sim N(\hat{\boldsymbol{\beta}}, \mathbf{V}_\boldsymbol{\beta}) \]
(caveat: the normality is only approximate for non-normal response)
What does this mean? \( \mathbf{V}_{\boldsymbol{\beta}} \) holds the variance of each parameter on its diagonal (and the covariances between parameters off it).
In mgcv: `vcov(model)` returns \( \mathbf{V}_{\boldsymbol{\beta}} \).
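A minimal sketch (simulated data, not from the slides) of what `vcov()` returns: a square matrix with one row and column per model coefficient, whose diagonal gives each parameter's variance.

```r
library(mgcv)

# simulated example data and a simple fit (illustrative only)
set.seed(2)
dat <- gamSim(1, n = 200, verbose = FALSE)
b <- gam(y ~ s(x0) + s(x1), data = dat)

Vb <- vcov(b)          # the covariance matrix V_beta
se <- sqrt(diag(Vb))   # standard error of each coefficient
```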
This is the machinery behind the confidence bands in `plot` and the `se.fit` standard errors. For regular predictions:
\[ \hat{\boldsymbol{\eta}}_p = L_p \hat{\boldsymbol{\beta}} \]
We form \( L_p \) using the prediction data, evaluating the basis functions as we go.
(Need to apply the link function to \( \hat{\boldsymbol{\eta}}_p \))
But the \( L_p \) fun doesn't stop there…
To get variance on the scale of the linear predictor:
\[ V_{\hat{\boldsymbol{\eta}}} = L_p V_{\hat{\boldsymbol{\beta}}} L_p^\text{T} \]
pre- and post-multiplying by \( L_p \) shifts the variance matrix from parameter space to linear predictor space.
(Can then pre-/post-multiply by derivatives of the link to put variance on response scale)
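The whole chain can be checked by hand: `predict(..., type="lpmatrix")` returns \( L_p \) directly, and the variance formula above reproduces `predict`'s own standard errors. (Simulated data below, for illustration.)

```r
library(mgcv)

# simulated example data and fit (illustrative only)
set.seed(3)
dat <- gamSim(1, n = 200, verbose = FALSE)
b <- gam(y ~ s(x1), data = dat)

newd <- data.frame(x1 = seq(0, 1, length.out = 20))

# L_p maps model coefficients to the linear predictor at the new points
Lp <- predict(b, newdata = newd, type = "lpmatrix")
eta_hat <- drop(Lp %*% coef(b))

# V_eta = L_p V_beta L_p^T; its diagonal holds the squared standard errors
V_eta <- Lp %*% vcov(b) %*% t(Lp)
se_eta <- sqrt(diag(V_eta))

# agrees with predict's own standard errors on the link scale
p <- predict(b, newdata = newd, se.fit = TRUE)
```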
- `$Vp`: what we got with `vcov()`
- `$Vc`: the corrected version (accounts for smoothing parameter uncertainty)

mgcv does most of the hard work for us: `predict` is your friend, and mgcv shields you from most of the underlying matrix algebra.
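In code (a sketch on simulated data; the model is fit with REML, which `vcov(..., unconditional=TRUE)` needs in order to produce the corrected matrix):

```r
library(mgcv)

# simulated example data, REML fit (illustrative only)
set.seed(4)
dat <- gamSim(1, n = 200, verbose = FALSE)
b <- gam(y ~ s(x2), data = dat, method = "REML")

Vp <- vcov(b)                         # b$Vp: conditional on lambda
Vc <- vcov(b, unconditional = TRUE)   # b$Vc: corrected for lambda uncertainty
```

Since the correction only adds uncertainty, the diagonal of `Vc` is never smaller than that of `Vp`.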